1.
Comput Methods Programs Biomed ; 249: 108157, 2024 Jun.
Article En | MEDLINE | ID: mdl-38582037

BACKGROUND AND OBJECTIVE: T-wave alternans (TWA) is a fluctuation in the repolarization morphology of the ECG. It is associated with cardiac instability and sudden cardiac death risk. Diverse methods have been proposed for TWA analysis. However, TWA detection in ambulatory settings remains a challenge due to the absence of standardized evaluation metrics and detection thresholds. METHODS: In this work we use traditional signal processing-based TWA analysis methods for feature extraction, and two machine learning (ML) methods, namely, K-nearest neighbors (KNN) and random forest (RF), for TWA detection, addressing hyper-parameter tuning and feature selection. The final goal is the detection of short, non-sustained, and sparse TWA events in ambulatory recordings. RESULTS: We train the ML methods to detect a wide range of alternans voltages, from 20 to 100 µV, i.e., from non-visible micro-alternans to TWA of higher amplitudes, covering the span relevant to risk stratification. In classification, RF significantly outperforms the signal processing methods in recall, at the expense of a small loss in precision. Although ambulatory detection poses an imbalanced-class context, the trained ML systems always outperform the signal processing methods. CONCLUSIONS: We propose a comprehensive integration of multiple variables inspired by TWA signal processing methods to feed learning-based methods. The ML models consistently outperform the best signal processing methods, yielding superior recall scores.
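
For illustration, a minimal sketch of the kind of feature-based pipeline this abstract describes: features from classical TWA analysis feeding a random forest tuned for recall. The feature matrix, labels, and hyper-parameter grid below are illustrative assumptions, not the paper's actual setup.

```python
# Sketch of a feature-based TWA detector: classical TWA features feed an
# ML classifier with hyper-parameter tuning (as the abstract describes).
# Feature names, data, and grid values are illustrative placeholders.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import GridSearchCV

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 6))      # e.g., alternans voltage, K-score, ...
y = rng.integers(0, 2, size=500)   # 1 = TWA episode present

# Hyper-parameter tuning, optimizing recall as in the abstract's emphasis
grid = {"n_estimators": [100, 300], "max_depth": [None, 10]}
clf = GridSearchCV(RandomForestClassifier(random_state=0), grid,
                   scoring="recall", cv=5)
clf.fit(X, y)
print(clf.best_params_, clf.best_score_)
```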


Arrhythmias, Cardiac , Electrocardiography, Ambulatory , Humans , Electrocardiography, Ambulatory/methods , Heart Rate , Arrhythmias, Cardiac/diagnosis , Death, Sudden, Cardiac , Signal Processing, Computer-Assisted , Electrocardiography/methods
2.
Med Biol Eng Comput ; 61(9): 2227-2240, 2023 Sep.
Article En | MEDLINE | ID: mdl-37010711

Noise and artifacts strongly affect the quality of the electrocardiogram (ECG) in long-term ECG monitoring (LTM), rendering some of its parts unusable for diagnosis. In contrast to the traditional approach, which assesses noise in terms of quantitative severity, clinical noise refers to a scale of qualitative severity levels, defined according to the way clinicians interpret the ECG, that aims to elucidate which ECG fragments are valid for reaching a diagnosis from a clinical point of view. This work proposes the use of machine learning (ML) techniques to categorize these qualitative noise severity levels, using a database annotated according to a clinical noise taxonomy as the gold standard. A comparative study is carried out using five representative ML methods, namely, K-nearest neighbors, decision trees, support vector machine, single-layer perceptron, and random forest. The models are fed with signal quality indexes characterizing the waveform in the time and frequency domains, as well as from a statistical viewpoint, to distinguish clinically valid ECG segments from invalid ones. A solid methodology to prevent overfitting to both the dataset and the patient is developed, taking into account class balance, patient separation, and patient rotation in the test set. All the proposed learning systems demonstrated good classification performance, attaining recall, precision, and F1 score of up to 0.78, 0.80, and 0.77, respectively, on the test set with a single-layer perceptron approach. These systems provide a classification solution for assessing the clinical quality of ECGs taken from LTM recordings. Graphical Abstract: Clinical Noise Severity Classification based on Machine Learning techniques towards Long-Term ECG Monitoring.
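
A central methodological point here is patient separation. The sketch below shows one way to implement patient-wise validation with scikit-learn's GroupKFold and a single-layer perceptron over synthetic signal quality indexes; all data and dimensions are placeholder assumptions, not the paper's pipeline.

```python
# Patient-wise cross-validation to avoid intra-patient overfitting:
# segments from one patient never appear in both train and test folds.
import numpy as np
from sklearn.model_selection import GroupKFold
from sklearn.linear_model import Perceptron   # single-layer perceptron
from sklearn.metrics import f1_score

rng = np.random.default_rng(1)
X = rng.normal(size=(600, 8))            # signal quality indexes (SQIs)
y = rng.integers(0, 2, size=600)         # clinically valid vs. invalid
patients = rng.integers(0, 30, size=600) # patient ID per segment

scores = []
for tr, te in GroupKFold(n_splits=5).split(X, y, groups=patients):
    model = Perceptron(random_state=0).fit(X[tr], y[tr])
    scores.append(f1_score(y[te], model.predict(X[te])))
print(np.mean(scores))
```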


Electrocardiography , Neural Networks, Computer , Humans , Electrocardiography/methods , Random Forest , Artifacts , Machine Learning , Algorithms
3.
Heliyon ; 9(1): e12947, 2023 Jan.
Article En | MEDLINE | ID: mdl-36699267

Background and objective: T-wave alternans (TWA) is a fluctuation of the ST-T complex of the surface electrocardiogram (ECG) on an every-other-beat basis. It has been shown to be clinically helpful for sudden cardiac death stratification, though the lack of a gold standard to benchmark detection methods limits its application and impairs the development of alternative techniques. In this work, a novel approach based on machine learning for TWA detection is proposed. Additionally, a complete experimental setup is presented for benchmarking TWA detection methods. Methods: The proposed experimental setup is based on the use of open-source databases, to enable experiment replication, and of real ECG signals with added TWA episodes. Also, intra-patient overfitting and class imbalance have been carefully avoided. The Spectral Method (SM), the Modified Moving Average Method (MMA), and the Time Domain Method (TM) are used to obtain input features for the Machine Learning (ML) algorithms, namely, K-nearest neighbors, decision trees, random forest, support vector machine, and multi-layer perceptron. Results: No large differences were found in the performance of the different ML algorithms. Decision trees showed the best overall performance (accuracy 0.88 ± 0.04, precision 0.89 ± 0.05, recall 0.90 ± 0.05, F1 score 0.89 ± 0.03). Compared to the SM (accuracy 0.79, precision 0.93, recall 0.64, F1 score 0.76), there was an improvement in every metric except precision. Conclusions: In this work, a realistic database to test for the presence of TWA using ML algorithms was assembled. The ML algorithms overall outperformed the SM used as a gold standard. Learning from data to identify alternans yields a substantial increase in detection at the expense of a small increase in false alarms.
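
As background, the Spectral Method used here as the baseline can be sketched in a few lines: an FFT is taken across a window of aligned ST-T complexes, and alternans shows up at 0.5 cycles/beat. The 128-beat window, noise reference band, and K-score threshold below are common choices from the TWA literature, assumed here rather than taken from the paper.

```python
# Sketch of the Spectral Method (SM) for TWA detection.
import numpy as np

def spectral_method(stt_matrix):
    """stt_matrix: (n_beats, n_samples) of aligned ST-T complexes."""
    n_beats = stt_matrix.shape[0]
    spec = np.abs(np.fft.fft(stt_matrix, axis=0)) ** 2  # per-sample spectra
    spec = spec.mean(axis=1)               # aggregate over ST-T samples
    alt = spec[n_beats // 2]               # power at 0.5 cycles/beat
    noise = spec[int(0.33 * n_beats):int(0.48 * n_beats)]  # reference band
    return (alt - noise.mean()) / noise.std()  # K-score; TWA if > 3 (common rule)

beats = np.random.default_rng(2).normal(size=(128, 60))
beats[::2] += 0.05                         # inject an every-other-beat shift
print(spectral_method(beats))
```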

4.
Sensors (Basel) ; 22(20)2022 Oct 14.
Article En | MEDLINE | ID: mdl-36298178

Power line infrastructure is available almost everywhere. Positioning systems aim to estimate where a device or target is. Consequently, there may be an opportunity to use power lines for positioning purposes. This survey article reports the different efforts, working principles, and possibilities for implementing power line positioning systems (PLPS), i.e., positioning systems relying on power line infrastructure. Since Power Line Communication (PLC) systems of different characteristics have been deployed to provide communication services over the existing mains, we also address how PLC systems may be employed to build positioning systems. Although some efforts exist, PLPS are still prospective and thus open to research and development, and we indicate possible directions and potential applications for PLPS.

5.
Sensors (Basel) ; 21(1)2021 Jan 04.
Article En | MEDLINE | ID: mdl-33406684

The aim of this paper is to formulate the physical layer of the broadband and narrowband power line communication (PLC) systems described in the IEEE 1901 and IEEE 1901.2 standards, which address new communication technologies over electrical networks for Smart Grid and Internet of Things applications. Specifically, this paper presents a matrix-based mathematical formulation of a transmitter and receiver system based on windowed OFDM. The proposed formulation is essential for obtaining the input-output relation, as well as for analyzing the interference present in the system. It is very useful for simulating PLC systems with software designed to operate primarily on whole matrices and arrays, such as MATLAB. In addition, it eases the analysis and design of different receiver configurations, simply by modifying or adding a matrix. Since the relevant standards only describe the blocks corresponding to the transmitter, leaving the set-up of the receiver open to the manufacturer, we analysed four different possible schemes that include window functions in different configurations. In simulations, the behaviour of each of these schemes is analysed in terms of bit error rate and achievable data rate using artificial and real noise.
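
To make the matrix formulation concrete, here is a minimal CP-OFDM transmitter/receiver written purely with matrices, in the spirit of the paper; windowing and the IEEE 1901/1901.2 parameters are omitted, so this is a simplified sketch rather than the standard-compliant system.

```python
# Matrix formulation of a CP-OFDM link: every block (IDFT, cyclic prefix
# insertion/removal, DFT) is a matrix, so the input-output relation is a
# product of matrices. Windowing is omitted for brevity.
import numpy as np

N, cp = 8, 2                                  # subcarriers, CP length
F = np.fft.fft(np.eye(N)) / np.sqrt(N)        # unitary DFT matrix
A = np.vstack([np.eye(N)[-cp:], np.eye(N)])   # CP insertion, (N+cp) x N
R = np.hstack([np.zeros((N, cp)), np.eye(N)]) # CP removal, N x (N+cp)

s = (np.sign(np.random.randn(N)) + 1j * np.sign(np.random.randn(N))) / np.sqrt(2)
x = A @ (F.conj().T @ s)                      # transmit: IDFT, then CP

h = np.array([1.0, 0.4, 0.2])                 # short channel (<= cp + 1 taps)
y = np.convolve(x, h)[: N + cp]               # channel output for one block

Y = F @ (R @ y)                               # receive: drop CP, DFT
H = np.fft.fft(h, N)                          # per-subcarrier channel gains
print(np.allclose(Y / H, s))                  # one-tap equalization recovers s
```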

6.
Sensors (Basel) ; 20(11)2020 May 29.
Article En | MEDLINE | ID: mdl-32485879

In recent years, the first commercially available equipment for Electrocardiographic Imaging (ECGI), a new cardiac diagnostic tool that opens up a new field of diagnostic possibilities, has attracted both attention and controversy. The previous knowledge and criteria of cardiologists using intracardiac electrograms (EGMs) should be revisited in light of the newly available spatial-temporal potentials, and digital signal processing should be readapted to this new data structure. Aiming to contribute to the usefulness of ECGI recordings within the current knowledge and methods of cardiac electrophysiology, we previously presented two results: first, spatial consistency can be observed even for very basic cardiac signal processing stages (such as baseline wander removal and low-pass filtering); second, useful bipolar EGMs can be obtained by a digital processing operator searching for the maximum amplitude and including a time delay. In addition, this work aims to demonstrate the functionality of ECGI for cardiac electrophysiology from a twofold view, namely, through the analysis of the EGM waveforms, and by studying the ventricular repolarization properties. The former is scrutinized in terms of the clustering properties of the unipolar and bipolar EGM waveforms in control and myocardial infarction subjects, and the latter is analyzed using the properties of T-wave alternans (TWA) in control and Long-QT syndrome (LQTS) example subjects. Clustered regions of the EGMs were spatially consistent and congruent with the presence of infarcted tissue in unipolar EGMs, and bipolar EGMs with adequate signal processing operators held this consistency and yielded a larger, yet moderate, number of spatial-temporal regions. In terms of the alternans amplitude estimated from the unipolar EGMs, TWA was absent in the control subject, in contrast to the LQTS subject; moreover, higher spatial-temporal variation was present in the LQTS torso and epicardium measurements, a finding consistent across three different methods of alternans estimation. We conclude that spatial-temporal analysis of EGMs in ECGI will pave the way towards enhanced usefulness in clinical practice, so that the atomic, signal-by-signal processing approach should be conveniently revisited to deal with the great amount of information that ECGI conveys to the clinician.


Arrhythmias, Cardiac , Electrocardiography , Electrophysiologic Techniques, Cardiac , Arrhythmias, Cardiac/diagnosis , Body Surface Potential Mapping , Cluster Analysis , Humans
7.
Sensors (Basel) ; 20(11)2020 Jun 01.
Article En | MEDLINE | ID: mdl-32492938

In recent years, Electrocardiographic Imaging (ECGI) has emerged as a powerful and promising clinical tool to support cardiologists. Starting from a plurality of potential measurements on the torso, ECGI yields a noninvasive estimation of the potentials on the epicardium that cause them. This unprecedented amount of measured cardiac signals needs to be conditioned and adapted to current knowledge and methods in cardiac electrophysiology in order to maximize its support to clinical practice. In this setting, many cardiac indices are defined in terms of so-called bipolar electrograms, which correspond to differential potentials between two spatially close potential measurements. Our aim was to contribute to the usefulness of ECGI recordings within the current knowledge and methods of cardiac electrophysiology. For this purpose, we first analyzed the basic stages of conventional cardiac signal processing and scrutinized the implications of the spatial-temporal nature of signals in ECGI scenarios. Specifically, the stages of baseline wander removal, low-pass filtering, and beat segmentation and synchronization were considered. We also aimed to establish a mathematical operator to provide suitable bipolar electrograms from the ECGI-estimated epicardial potentials. Results were obtained on data from an infarction patient and from a healthy subject. First, the low-frequency and high-frequency noise components are shown to be non-independently distributed in the ECGI-estimated recordings due to their spatial dimension. Second, bipolar electrograms are better estimated when using the criterion of the maximum-amplitude difference between spatial neighbors, but a temporal delay of about 40 samples in discrete time also has to be included to obtain the morphology usual in clinical bipolar electrograms from catheters. We conclude that spatial-temporal digital signal processing and bipolar electrograms can pave the way towards the usefulness of ECGI recordings in cardiological clinical practice. The companion paper is devoted to analyzing clinical indices obtained from ECGI epicardial electrograms, measuring waveform variability and repolarization tissue properties.
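
The bipolar-electrogram operator described above (maximum-amplitude difference between spatial neighbors, plus a delay of about 40 samples) can be sketched as follows; the mesh adjacency and the signals themselves are toy assumptions.

```python
# Sketch of a bipolar EGM operator: for each node, subtract the spatial
# neighbor giving the maximum-amplitude difference, after delaying the
# local unipolar signal by ~40 samples (per the paper's finding).
import numpy as np

def bipolar_egm(unipolar, neighbors, delay=40):
    """unipolar: (n_nodes, n_samples); neighbors: list of index lists."""
    n_nodes, n_samples = unipolar.shape
    bipolar = np.zeros((n_nodes, n_samples - delay))
    for i in range(n_nodes):
        ref = unipolar[i, delay:]                          # delayed local EGM
        cands = [ref - unipolar[j, :n_samples - delay] for j in neighbors[i]]
        amps = [np.ptp(c) for c in cands]                  # peak-to-peak amplitude
        bipolar[i] = cands[int(np.argmax(amps))]           # max-amplitude neighbor
    return bipolar

egm = np.random.default_rng(3).normal(size=(4, 500))       # toy 4-node mesh
print(bipolar_egm(egm, [[1], [0, 2], [1, 3], [2]]).shape)
```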


Body Surface Potential Mapping , Electrocardiography , Pericardium/physiology , Signal Processing, Computer-Assisted , Diagnostic Imaging , Humans
8.
Sensors (Basel) ; 18(11)2018 Nov 05.
Article En | MEDLINE | ID: mdl-30400587

In recent years, a number of proposals for electrocardiogram (ECG) monitoring based on mobile systems have been put forward. We propose here an STM32F-microcontroller-based mobile ECG system providing both long-term (several weeks) Holter monitoring and 12-lead ECG recording, in accordance with the clinical standard requirements for these kinds of recordings, which in addition can apply digital compression at stages close to the acquisition. The system can be especially useful in rural areas of developing countries, where the lack of specialized medical personnel justifies the introduction of telecardiology services, and where the limitations of coverage and bandwidth of cellular networks require the use of efficient signal compression systems. The prototype was implemented using a small architecture, with a 16-bits-per-sample resolution. We also used a low-noise TI ADS1198 instrumentation amplifier, which has a multiplexer and an analog-to-digital converter (16 bits and 8 channels) connected to the STM32F processor, whose architecture incorporates a digital signal processing unit and a floating-point unit. On the one hand, the system's portability allows the user to carry the prototype in a pocket and to perform an ECG examination, either in 12-lead controlled conditions or in Holter monitoring, according to the required clinical scenario. An app on the smartphone gives users a friendly interface to set up the system. On the other hand, the electronic health records of the patients are registered in a web application, which in turn allows them to connect to the Internet from their cellphones, and the ECG signals are then sent through a web server for subsequent and ubiquitous analysis by doctors at any convenient terminal device. In order to determine the quality of the received signals, system testing was performed in the three following scenarios: (1) the prototype was connected to the patient and the signals were subsequently stored; (2) the prototype was connected to the patient and the data were subsequently transferred to the cellphone; (3) the prototype was connected to the patient, and the data were transferred to the cellphone and to the web via the Internet. An additional benchmarking test with expert clinicians showed the clinical quality provided by the system. The proposed ECG system is a first step and paves the way toward mobile cardiac monitors compatible with electrocardiographic practice, including long-term monitoring, 12-lead usability, and the possibility of incorporating signal compression at the early stages of ECG acquisition.


Electrocardiography/instrumentation , Signal Processing, Computer-Assisted , Telemedicine/instrumentation , Calibration , Cell Phone , Electrodes , Humans , Internet , Reproducibility of Results , Smartphone , Software
9.
Sensors (Basel) ; 18(5)2018 May 01.
Article En | MEDLINE | ID: mdl-29723990

Despite the wide literature on R-wave detection algorithms for ECG Holter recordings, long-term monitoring applications are bringing new requirements, and it is not clear that the existing methods can be straightforwardly used in those scenarios. Our aim in this work was twofold: first, we scrutinized the scope and limitations of existing methods for Holter monitoring when moving to long-term monitoring; second, we proposed and benchmarked a beat detection method with adequate accuracy and usefulness in long-term scenarios. A longitudinal study was made with the most widely used waveform analysis algorithms, which allowed us to tune the free parameters of the required blocks, and a transversal study analyzed how these parameters change when moving to different databases. With all the above, the extension to long-term monitoring was proposed and analyzed on a database of 7-day Holter recordings, using optimized simultaneous-multilead processing. We considered both our own and public databases. In this new scenario, noise-avoidance mechanisms are more important due to the amount of noise present in these recordings; moreover, computational efficiency is a key parameter for exporting the algorithm to clinical practice. The method based on a Polling function outperformed the others in terms of accuracy and computational efficiency, yielding 99.48% sensitivity, 99.54% specificity, 99.69% positive predictive value, 99.46% accuracy, and 0.85% error on the MIT-BIH Arrhythmia Database. We conclude that the method can be used in long-term Holter monitoring systems.
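
Since the Polling function itself is not specified in the abstract, the sketch below is only a generic stand-in for simultaneous-multilead beat detection: a crude per-lead slope detector whose detections are fused by majority vote across leads.

```python
# Generic multilead beat detection with vote fusion (illustrative only;
# NOT the paper's Polling function). Each lead votes for candidate R-peak
# positions; positions supported by enough leads are kept.
import numpy as np
from scipy.signal import find_peaks

def detect_beats(ecg, fs, vote=2, tol=0.05):
    """ecg: (n_leads, n_samples). Returns fused R-peak sample indices."""
    votes = np.zeros(ecg.shape[1])
    w = int(tol * fs)                                   # vote tolerance window
    for lead in ecg:
        d = np.abs(np.diff(lead, prepend=lead[0]))      # slope emphasis
        peaks, _ = find_peaks(d, height=4 * d.std(), distance=int(0.25 * fs))
        for p in peaks:                                 # spread vote over +/- tol
            votes[max(0, p - w):p + w] += 1
    fused, _ = find_peaks(votes, height=vote - 0.5, distance=int(0.25 * fs))
    return fused

fs = 360
t = np.arange(10 * fs) / fs
spikes = (np.sin(2 * np.pi * 1.2 * t) > 0.99).astype(float)  # crude beat train
ecg = np.vstack([spikes + 0.01 * np.random.randn(t.size) for _ in range(3)])
print(detect_beats(ecg, fs))
```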

10.
Sensors (Basel) ; 17(11)2017 Oct 25.
Article En | MEDLINE | ID: mdl-29068362

Noise and artifacts are inherent contaminating components and are particularly present in Holter electrocardiogram (ECG) monitoring. The presence of noise is even more significant in long-term monitoring (LTM) recordings, as these are collected over several days from patients following their daily activities; hence, strong artifact components can temporarily impair the clinical measurements obtained from LTM recordings. Traditionally, noise has been dealt with as a problem of removing non-desirable components by means of quantitative signal metrics such as the signal-to-noise ratio (SNR), but current systems do not provide any information about the true impact of noise on the clinical evaluation of the ECG. As a first step towards an alternative to classical approaches, this work assesses ECG quality under the assumption that an ECG has good quality when it is clinically interpretable. Therefore, our hypotheses are that it is possible (a) to create a clinical severity score for the effect of noise on the ECG, (b) to characterize its consistency in terms of its temporal and statistical distribution, and (c) to use it for signal quality evaluation in LTM scenarios. For this purpose, a database of external event recorder (EER) signals is assembled and labeled from a clinical point of view for use as the gold standard of noise severity categorization. These devices are assumed to capture the signal segments most prone to be corrupted by noise over long-term periods. Then, the ECG noise is characterized by comparing these clinical severity criteria with conventional quantitative metrics taken from traditional noise-removal approaches, and noise maps are proposed as a novel representation tool to carry out this comparison. Our results showed that none of the benchmarked quantitative noise measurement criteria provides an accurate enough estimation of the clinical severity of the noise. A case study of long-term ECG is reported, showing the statistical and temporal correspondences and properties with respect to the EER signals used to create the gold standard for clinical noise. The proposed noise maps, together with the statistically consistent characterization of clinical noise severity, pave the way towards forthcoming systems that allow the user to process different ECG segments with different techniques and in terms of different measured clinical parameters.


Electrocardiography/methods , Algorithms , Artifacts , Electrocardiography/standards , Electrocardiography, Ambulatory , Humans , Signal-To-Noise Ratio
11.
Comput Methods Programs Biomed ; 145: 147-155, 2017 Jul.
Article En | MEDLINE | ID: mdl-28552120

BACKGROUND AND OBJECTIVE: T-wave alternans (TWA) is a fluctuation of the ST-T complex occurring on an every-other-beat basis in the surface electrocardiogram (ECG). It has been shown to be an informative risk stratifier for sudden cardiac death, though the lack of a gold standard to benchmark detection methods has promoted the use of synthetic signals. This work proposes a novel signal model to study the performance of TWA detection. Additionally, the methodological validation of a denoising technique based on empirical mode decomposition (EMD), which is used here along with the spectral method (SM), is also tackled. METHODS: The proposed test bed system is based on the following guidelines: (1) use of open-source databases to enable experimental replication; (2) use of real ECG signals and physiological noise; (3) inclusion of randomized TWA episodes. Sensitivity (Se) and specificity (Sp) are analyzed separately. Also, a nonparametric hypothesis test based on bootstrap resampling is used to determine whether the presence of the EMD block actually improves the performance. RESULTS: The results show an outstanding specificity when the EMD block is used, even in very noisy conditions (0.96 compared to 0.72 for SNR = 8 dB), always superior to that of the conventional SM alone. Regarding sensitivity, the EMD method also performs better in noisy conditions (0.57 compared to 0.46 for SNR = 8 dB), while it decreases in noiseless conditions. CONCLUSIONS: The proposed test setting, designed to analyze the performance, guarantees that the actual physiological variability of the cardiac system is reproduced. The use of the EMD-based block in noisy environments enables the identification of most patients with fatal arrhythmias.
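
The nonparametric test mentioned here can be sketched as a paired bootstrap on per-record performance differences; the scores below are synthetic placeholders, not the paper's results.

```python
# Paired bootstrap test: does the detector with the EMD block improve a
# performance metric over the baseline? H0: no improvement.
import numpy as np

rng = np.random.default_rng(4)
score_emd = rng.normal(0.90, 0.05, size=50)    # e.g., per-record specificity
score_base = rng.normal(0.80, 0.05, size=50)   # baseline without EMD block

diff = score_emd - score_base
boot = np.array([rng.choice(diff, size=diff.size, replace=True).mean()
                 for _ in range(10000)])       # bootstrap of the mean difference
p_value = np.mean(boot <= 0)                   # one-sided: improvement > 0
print(diff.mean(), p_value)
```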


Arrhythmias, Cardiac/diagnosis , Electrocardiography/standards , Benchmarking , Humans , Sensitivity and Specificity
12.
Front Physiol ; 7: 82, 2016.
Article En | MEDLINE | ID: mdl-27014083

Great effort has been devoted in recent years to the development of sudden cardiac risk predictors as a function of electric cardiac signals, mainly obtained from electrocardiogram (ECG) analysis. But these prediction techniques are still seldom used in clinical practice, partly due to their limited diagnostic accuracy and to the lack of consensus about the appropriate computational signal processing implementation. This paper takes a three-fold approach, based on ECG indices, to structure this review on sudden cardiac risk stratification: first, through the computational techniques that have been widely proposed in the technical literature for obtaining these indices; second, through the scientific evidence, which, although supported by observational clinical studies, is not always representative enough; and third, through the limited technology transfer of academically accepted algorithms, which requires further reflection for future systems. We focus on three families of ECG-derived indices, which are tackled from the aforementioned viewpoints, namely, heart rate turbulence (HRT), heart rate variability (HRV), and T-wave alternans. In terms of computational algorithms, we still need clearer scientific evidence, standardization, and benchmarking, resting on advanced algorithms applied to large and representative datasets. New scenarios such as electronic health records, big data, long-term monitoring, and cloud databases will eventually open new frameworks in which to foresee suitable new paradigms in the near future.

13.
Physiol Meas ; 36(9): 1981-94, 2015 Sep.
Article En | MEDLINE | ID: mdl-26260978

The aim of electrocardiogram (ECG) compression is to reduce the amount of data as much as possible while preserving the information that is significant for diagnosis. Objective metrics derived directly from the signal are suitable for controlling the quality of compressed ECGs in practical applications. Many approaches have employed figures of merit based on the percentage root mean square difference (PRD) for this purpose. The benefits and drawbacks of PRD measures, along with other metrics for quality assessment in ECG compression, are analysed in this work. We propose the use of the root mean square error (RMSE) for quality control because it provides a clearer and more stable idea of how much the retrieved ECG waveform, which is the reference signal for establishing diagnosis, deviates from the original. For this reason, the RMSE is applied here as the target metric in a thresholding algorithm that relies on the retained energy. A state-of-the-art compressor based on this approach, and its PRD-based counterpart, are implemented to test the actual capabilities of the proposed technique. Both compression schemes are employed in several experiments with the whole MIT-BIH Arrhythmia Database to assess both global and local signal distortion. The results show that, using the RMSE for quality control, the distortion of the reconstructed signal is better controlled without reducing the compression ratio.
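
The two metrics under discussion, in code form. Note that PRD is written here without mean removal; published variants (e.g., the normalized PRDN) differ on this point.

```python
# PRD is relative to signal energy (dimensionless, in %), while RMSE is
# expressed in the signal's own units, which is the paper's argument for
# using RMSE as the quality-control target.
import numpy as np

def prd(x, x_rec):
    return 100 * np.sqrt(np.sum((x - x_rec) ** 2) / np.sum(x ** 2))

def rmse(x, x_rec):
    return np.sqrt(np.mean((x - x_rec) ** 2))

x = np.sin(np.linspace(0, 2 * np.pi, 1000))                 # toy "ECG"
x_rec = x + 0.01 * np.random.default_rng(5).normal(size=x.size)
print(prd(x, x_rec), rmse(x, x_rec))
```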


Data Compression/methods , Electrocardiography/methods , Algorithms , Arrhythmias, Cardiac/physiopathology , Data Compression/standards , Databases, Factual , Electrocardiography/standards , Quality Control
14.
IEEE J Biomed Health Inform ; 19(2): 508-19, 2015 Mar.
Article En | MEDLINE | ID: mdl-24846672

Recent results in telecardiology show that compressed sensing (CS) is a promising tool to lower energy consumption in wireless body area networks for electrocardiogram (ECG) monitoring. However, the performance of current CS-based algorithms, in terms of compression rate and reconstruction quality of the ECG, still falls short of the performance attained by state-of-the-art wavelet-based algorithms. In this paper, we propose to exploit the structure of the wavelet representation of the ECG signal to boost the performance of CS-based methods for compression and reconstruction of ECG signals. More precisely, we incorporate prior information about the wavelet dependencies across scales into the reconstruction algorithms and exploit the high fraction of common support of the wavelet coefficients of consecutive ECG segments. Experimental results utilizing the MIT-BIH Arrhythmia Database show that significant performance gains, in terms of compression rate and reconstruction quality, can be obtained by the proposed algorithms compared to current CS-based methods.


Data Compression/methods , Electrocardiography/methods , Algorithms , Databases, Factual , Humans , Remote Sensing Technology , Wavelet Analysis , Wireless Technology
15.
Physiol Meas ; 33(7): 1237-47, 2012 Jul.
Article En | MEDLINE | ID: mdl-22735392

Coding distortion in lossy electroencephalographic (EEG) signal compression methods is evaluated through tractable objective criteria. The percentage root-mean-square difference, a global and relative indicator of the quality of the reconstructed waveforms, is the most widely used criterion. However, this parameter does not ensure compliance with clinical standard guidelines that specify limits to the allowable noise in EEG recordings. As a result, expert clinicians may have difficulty interpreting the resulting distortion of the EEG for a given value of this parameter. Conversely, the root-mean-square error is an alternative criterion that quantifies distortion in understandable units. In this paper, we demonstrate that the root-mean-square error is better suited to control and assess the distortion introduced by compression methods. The experiments conducted in this paper show that using the root-mean-square error as the target parameter in EEG compression allows both clinicians and scientists to infer whether the coding error is clinically acceptable or not, at no cost in terms of compression ratio.


Data Compression/methods , Electroencephalography/methods , Statistics as Topic/methods , Adolescent , Child , Child, Preschool , Databases as Topic , Female , Humans , Infant , Male , Young Adult
16.
Med Eng Phys ; 34(7): 892-9, 2012 Sep.
Article En | MEDLINE | ID: mdl-22056794

The use of long-term recordings in electroencephalography is becoming more frequent due to their diagnostic potential and the growth of novel signal processing methods that deal with these types of recordings. In these cases, the considerable volume of data to be managed makes compression necessary to reduce the bit rate for transmission and storage applications. In this paper, a new compression algorithm specifically designed to encode electroencephalographic (EEG) signals is proposed. Cosine modulated filter banks are used to decompose the EEG signal into a set of subbands well adapted to the frequency bands characteristic of the EEG. Given that no regular pattern can easily be extracted from the signal in the time domain, a thresholding-based method is applied for quantizing samples. The retained-energy method is designed to efficiently compute the threshold in the decomposition domain, which at the same time allows the quality of the reconstructed EEG to be controlled. The experiments are conducted over a large set of signals taken from two public databases available at Physionet, and the results show that the compression scheme yields better compression than other reported methods.
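
The retained-energy thresholding idea can be sketched independently of the filter bank: keep the largest transform coefficients until a target fraction of the signal energy is preserved, zeroing the rest. The transform stage is abstracted away here and the target value is an assumption.

```python
# Thresholding by retained energy: the kept-coefficient count follows
# directly from a target energy fraction, which ties quality control to
# a single interpretable parameter.
import numpy as np

def retained_energy_threshold(coeffs, target=0.999):
    flat = coeffs.ravel()
    order = np.argsort(np.abs(flat))[::-1]              # largest magnitude first
    energy = np.cumsum(flat[order] ** 2) / np.sum(flat ** 2)
    keep = order[: int(np.searchsorted(energy, target)) + 1]
    out = np.zeros_like(flat)
    out[keep] = flat[keep]                              # zero everything else
    return out.reshape(coeffs.shape)

c = np.random.default_rng(6).laplace(size=256)          # sparse-ish coefficients
c_t = retained_energy_threshold(c, 0.99)
print(np.count_nonzero(c_t), "of", c.size, "coefficients kept")
```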


Electroencephalography/methods , Signal Processing, Computer-Assisted , Adolescent , Algorithms , Child , Child, Preschool , Databases, Factual , Entropy , Female , Humans , Infant , Male , Young Adult
17.
Article En | MEDLINE | ID: mdl-22254667

An innovative electrocardiogram compression algorithm is presented in this paper. The proposed method is based on matrix completion, a new paradigm in signal processing that seeks to recover a low-rank matrix from a small number of observations. The low-rank matrix is obtained via normalization of electrocardiogram records. Using matrix completion, the ECG data matrix is recovered from a small number of entries, thereby yielding high compression ratios comparable to those obtained by existing compression techniques. The proposed scheme offers a low-complexity encoder, good tolerance to quantization noise, and good reconstruction quality.
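
As a toy illustration of the matrix-completion principle (a generic solver, not the paper's specific scheme), a low-rank matrix can be recovered from a subset of its entries by iterative singular-value thresholding:

```python
# Soft-impute-style matrix completion: fill unobserved entries with the
# current estimate, then shrink the singular values toward a low rank.
import numpy as np

rng = np.random.default_rng(7)
M = rng.normal(size=(50, 3)) @ rng.normal(size=(3, 40))   # rank-3 "data" matrix
mask = rng.random(M.shape) < 0.4                          # 40% observed entries

X, tau = np.zeros_like(M), 2.0
for _ in range(500):
    U, s, Vt = np.linalg.svd(X + (M - X) * mask, full_matrices=False)
    X = (U * np.maximum(s - tau, 0)) @ Vt                 # singular-value shrinkage
print(np.linalg.norm((X - M) * ~mask) / np.linalg.norm(M * ~mask))
```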


Algorithms , Data Compression/methods , Diagnosis, Computer-Assisted/methods , Electrocardiography/methods , Signal Processing, Computer-Assisted , Humans , Reproducibility of Results , Sensitivity and Specificity
18.
Article En | MEDLINE | ID: mdl-22255966

Due to the large volume of information generated in an electroencephalographic (EEG) study, compression is needed for storage, processing, or transmission for analysis. In this paper, we evaluate and compare two lossy compression techniques applied to EEG signals: schemes based on decomposition by filter banks or by the wavelet packet transform, seeking the best compression, the best quality, and the most efficient real-time implementation. Due to specific properties of EEG signals, we propose a quantization stage adapted to the dynamic range of each band, aiming for higher quality. The results show that the filter-bank compressor performs better than the transform-based methods, and that quantization adapted to the dynamic range significantly enhances the quality.


Electroencephalography/instrumentation , Electroencephalography/methods , Signal Processing, Computer-Assisted , Algorithms , Computers , Data Compression , Humans , Models, Statistical , Polysomnography/instrumentation , Polysomnography/methods , Reproducibility of Results , Software , Wavelet Analysis
19.
Article En | MEDLINE | ID: mdl-22255969

The aim of electrocardiogram (ECG) compression is to achieve as much compression as possible while preserving the significant information in the reconstructed signal. Lossy thresholding-based compressors have shown good performance while needing low computational resources. In this work, two compression schemes that include nearly-perfect-reconstruction cosine modulated filter banks for the signal decomposition are proposed. They are evaluated for highly reliable applications, where the reconstructed signal must be very similar to the original. The whole MIT-BIH Arrhythmia Database and suitable metrics are used in the assessment to obtain representative results. The results show that the proposed compressors yield better performance than discrete wavelet transform-based techniques when high quality requirements are imposed.


Data Compression/methods , Electrocardiography/methods , Signal Processing, Computer-Assisted , Algorithms , Arrhythmias, Cardiac/physiopathology , Computers , Humans , Models, Statistical , Reproducibility of Results , Software , Wavelet Analysis
20.
IEEE Trans Biomed Eng ; 57(10): 2402-12, 2010 Oct.
Article En | MEDLINE | ID: mdl-20409985

Repolarization alternans, or T-wave alternans (TWA), is a subject of great interest as it has been shown to be a risk stratifier for sudden cardiac death. As TWA consists of subtle, nonvisible variations of the ST-T complex, its detection may become more difficult in noisy environments, such as stress testing or Holter recordings. In this paper, a technique based on empirical mode decomposition (EMD) is proposed to separate the useful information of the ST-T complex from noise and artifacts. The identification of the useful part of the signal is based on the study of complexity in the EMD domain by means of the Hjorth descriptors. As a result, a robust technique to extract the trend of the ST-T complex has been achieved. The evaluation of the method is carried out with the spectral method (SM) over several public-domain databases with ECGs sampled at different frequencies. The results show that the SM with the proposed technique outperforms the traditional SM by more than 2 dB. Also, the robustness of the technique is guaranteed as it does not introduce any additional distortion to the detector in noiseless conditions.
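
The Hjorth descriptors used to rank the EMD modes by complexity are simple to compute; the sketch below follows their standard definitions (activity, mobility, complexity), with toy test signals.

```python
# Hjorth descriptors of a signal: activity (variance), mobility
# (normalized spectral width of the derivative), and complexity
# (how much the shape deviates from a pure sine).
import numpy as np

def hjorth(x):
    dx = np.diff(x)
    ddx = np.diff(dx)
    activity = np.var(x)
    mobility = np.sqrt(np.var(dx) / np.var(x))
    complexity = np.sqrt(np.var(ddx) / np.var(dx)) / mobility
    return activity, mobility, complexity

t = np.linspace(0, 1, 1000)
print(hjorth(np.sin(2 * np.pi * 5 * t)))                    # smooth: low complexity
print(hjorth(np.random.default_rng(8).normal(size=1000)))   # noise: high complexity
```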


Electrocardiography/methods , Models, Cardiovascular , Signal Processing, Computer-Assisted , Algorithms , Artifacts , Computer Simulation , Databases, Factual , Heart Ventricles/physiopathology , Humans , Nonlinear Dynamics
...